Two simple resistant regression estimators

Author

  • David J. Olive

Abstract

Two simple resistant regression estimators with O_P(n^{-1/2}) convergence rate are presented. Ellipsoidal trimming can be used to trim the cases corresponding to predictor variables x with large Mahalanobis distances, and the forward response plot of the residuals versus the fitted values can be used to detect outliers. The first estimator uses ten forward response plots corresponding to ten different trimming proportions, and the final estimator corresponds to the “best” forward response plot. The second estimator is similar to the elemental resampling algorithm, but sets of O(n) cases are used instead of randomly selected elemental sets. These two estimators should be regarded as new tools for outlier detection rather than as replacements for existing methods. Outliers should always be examined.

* David J. Olive is Associate Professor, Department of Mathematics, Southern Illinois University, Mailcode 4408, Carbondale, IL 62901-4408, USA. E-mail address: [email protected]. This research was supported by NSF grant DMS 0202922. The author is grateful to the editors and referees for a number of helpful suggestions for improvement in the article.
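The abstract only sketches the trimming step. As a minimal illustration of the ellipsoidal-trimming idea (drop the cases whose predictors have the largest Mahalanobis distances, refit least squares, and inspect residuals versus fitted values), here is a sketch using the classical sample mean and covariance as the trimming center and scatter; the function name and trimming proportion are illustrative, not from the paper:

```python
import numpy as np

def mahalanobis_trim_ols(X, y, trim=0.1):
    """OLS after ellipsoidal trimming: remove the fraction `trim` of cases
    whose predictors have the largest Mahalanobis distances, then refit."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    center = X.mean(axis=0)
    cov = np.atleast_2d(np.cov(X, rowvar=False))  # guard univariate case
    inv_cov = np.linalg.inv(cov)
    diff = X - center
    d2 = np.einsum('ij,jk,ik->i', diff, inv_cov, diff)  # squared distances
    keep = d2 <= np.quantile(d2, 1.0 - trim)            # trim the largest
    Xk = np.column_stack([np.ones(keep.sum()), X[keep]])
    beta, *_ = np.linalg.lstsq(Xk, y[keep], rcond=None)
    # Residuals for ALL n cases, for a residuals-vs-fitted outlier plot.
    fitted = np.column_stack([np.ones(len(X)), X]) @ beta
    return beta, y - fitted

# Usage: clean line y = 2 + 3x, then five predictors are shifted into
# bad leverage points; trimming should recover the clean fit.
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 1))
y = 2 + 3 * x[:, 0] + 0.1 * rng.normal(size=100)
x[:5] += 10  # leverage points with mismatched responses
beta, resid = mahalanobis_trim_ols(x, y, trim=0.1)
```

The paper's first estimator repeats this over ten trimming proportions and picks the "best" forward response plot; the sketch above shows a single trimming pass only.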


Similar articles

Strategy-proof estimators for simple regression

In this paper we propose a whole class of estimators (clockwise repeated median estimators or CRM) for the simple regression model that are immune to manipulation by the agents generating the data. Although strategic considerations affecting the stability of the estimated parameters in regression models have already been studied (the Lucas critique), few efforts have been made to design estimat...
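The CRM class itself is only hinted at in this truncated abstract. As a hedged illustration of a median-based slope estimator in the same spirit (a minority of strategically reported points cannot move the fit far), here is Siegel's classical repeated median, not the CRM estimator itself:

```python
import numpy as np

def repeated_median_slope(x, y):
    """Siegel's repeated median: for each point i, take the median of the
    pairwise slopes to every other point, then the median of those inner
    medians. Its 50% breakdown point makes it highly outlier-resistant."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    inner = []
    for i in range(len(x)):
        dx = x - x[i]
        dy = y - y[i]
        slopes = dy[dx != 0] / dx[dx != 0]   # exclude the point itself
        inner.append(np.median(slopes))
    slope = np.median(inner)
    intercept = np.median(y - slope * x)
    return intercept, slope

# Usage: y = 1 + 2x with the last three responses corrupted to 100;
# 30% contamination leaves the repeated-median fit essentially unchanged.
x = np.arange(10.0)
y = 1.0 + 2.0 * x
y[-3:] = 100.0
a, b = repeated_median_slope(x, y)  # a ≈ 1, b ≈ 2
```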


Non-parametric entropy estimators based on simple linear regression

Estimators for differential entropy are proposed. The estimators are based on the second order expansion of the probability mass around the inspection point with respect to the distance from the point. Simple linear regression is utilized to estimate the values of density function and its second derivative at a point. After estimating the values of the probability density function at each of th...


Corrected Maximum Likelihood Estimators in Linear Heteroskedastic Regression Models*

The linear heteroskedastic regression model, for which the variance of the response is given by a suitable function of a set of linear exogenous variables, is very useful in econometric applications. We derive a simple matrix formula for the n^{-1} biases of the maximum likelihood estimators of the parameters in the variance of the response, where n is the sample size. These biases are easily obtain...


Shrinkage estimators of intercept parameters of two simple regression models with suspected equal slopes

Estimators of the intercept parameter of a simple linear regression model involve the slope estimator. In this paper, we consider the estimation of the intercept parameters of two linear regression models with normal errors, when it is a priori suspected that the two regression lines are parallel, but in doubt. We also introduce a coefficient of distrust as a measure of degree of lack of trust ...


Estimating the error variance in nonparametric regression by a covariate-matched U-statistic

For nonparametric regression models with fixed and random design, two classes of estimators for the error variance have been introduced: second sample moments based on residuals from a nonparametric fit, and difference-based estimators. The former are asymptotically optimal but require estimating the regression function; the latter are simple but have larger asymptotic variance. For nonparametr...
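The covariate-matched U-statistic of this paper is not reproduced in the truncated abstract. For context, the simplest member of the difference-based class it compares against is Rice's first-order estimator, sketched here under the usual assumption of a smooth mean function and i.i.d. errors:

```python
import numpy as np

def rice_variance(y):
    """Rice's (1984) difference-based estimator of the error variance in
    y_i = m(x_i) + eps_i with smooth m: successive differences cancel the
    trend, and E[(y_{i+1} - y_i)^2] ≈ 2 * sigma^2."""
    y = np.asarray(y, dtype=float)
    d = np.diff(y)
    return np.sum(d * d) / (2 * (len(y) - 1))

# Usage: smooth sine trend plus N(0, 0.5^2) noise; the estimate should be
# near the true error variance 0.25 without ever fitting the trend.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 2000)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.5, size=x.size)
est = rice_variance(y)
```

Difference-based estimators like this are simple and avoid estimating m, at the cost of a larger asymptotic variance than residual-based second moments, which is the trade-off the paper addresses.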



Journal:
  • Computational Statistics & Data Analysis

Volume 49, Issue -

Pages -

Publication date: 2005